Software Tools in High Energy Physics

Authors

  • J. Swain
  • L. Taylor
Abstract

We review the software tools in High Energy Physics which may also be appropriate for use in the field of Cosmic Ray Physics. We describe the status and future plans for Object-Oriented (OO) data storage and access; event and detector simulation, including physics generators and GEANT4; and the HEP class libraries for generic tasks such as data presentation, statistical and numerical analysis, and visualisation. Particular emphasis is placed on the current transition to an Object-Oriented paradigm for the core software packages, such as the CERN program library, as well as for the offline software of the major new experiments.

INTRODUCTION

There is a wide variety of mature software tools in use in High Energy Physics (HEP) which can also be used in the field of Cosmic Ray Physics. They cover, but are not limited to, the following tasks: data storage and access; physics event simulation; detector simulation; statistical and numerical analysis; physics analysis; data presentation; and event and detector visualisation. In this paper we first describe the typical software and data model used by HEP experiments and then focus on specific tools, with particular emphasis on future plans.

SOFTWARE AND DATA MODEL

Traditionally the preferred programming language of HEP has been Fortran, with some use of other languages such as C. Typically, experiments have a core of specific software for detector simulation, reconstruction, and analysis which draws on generic code residing in centrally maintained libraries such as CERNLIB¹. Event data are typically stored in, and accessed from, a hierarchical structure using a data management package such as ZEBRA (Brun, 1997) or BOS (Blobel, 1988). Slow control data, such as detector calibrations, may be stored in a variety of formats using ZEBRA, BOS, Oracle™, or even ASCII files.

¹ Freely available from CERN (http://wwwinfo.cern.ch/asd/ on the World Wide Web).

The traditional software and data model has deficiencies which make it difficult to meet the ever more demanding requirements of HEP experiments (Swain and Taylor, 1997b). In particular, the code is procedural, so that the algorithms are distinct from the data themselves, resulting in great difficulties in maintaining such a large software base (10⁵–10⁶ lines of code) over a long period of time. The HEP community has almost universally accepted that future software developments should use an Object-Oriented (OO) paradigm (Marino and Innocente, 1996), using C++ at least to start with. Objects are defined not only by their data but also by the services which they provide.

To give a specific example, consider a track made by a particle in a detector. The corresponding track object would be able to return, upon request, the raw hits used in the track fit, the momentum of the track, and so on. Internally, the object may contain only the raw hits as data elements. The momentum may be determined by the object itself using code, known as a method, which operates within the object by fitting the raw hits. The important feature is that the interface to the object is well defined, so that the internal features may be modified and improved without the need to update every part of the code which relies on tracks. For example, it may become too slow to re-fit the track every time a user asks for the momentum, in which case the internals of the object could be changed to compute the momentum only once, when the object is created, and then store the result internally. Provided that the interface is well defined, this encapsulation of both data and methods within objects makes for maintainable code.
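As a concrete, purely illustrative sketch of this encapsulation, consider the following C++ fragment. The class name, the Hit type, and the trivial stand-in for the track fit are our own assumptions, not the interface of any particular experiment's software.

```cpp
#include <vector>

// Illustrative raw-hit type; a real experiment would also store
// measurement errors, channel identifiers, and so on.
struct Hit { double x, y, z; };

// Hypothetical track object: clients see only the interface, while
// the momentum calculation remains an internal, changeable detail.
class Track {
public:
    explicit Track(const std::vector<Hit>& hits) : hits_(hits) {}

    const std::vector<Hit>& rawHits() const { return hits_; }

    double momentum() const {
        // Lazy evaluation: fit once, on first request, then cache.
        // These internals could later be changed (e.g. fitting in
        // the constructor instead) without touching client code.
        if (!fitted_) {
            momentum_ = fitHits();
            fitted_ = true;
        }
        return momentum_;
    }

private:
    double fitHits() const {
        // Stand-in for a real track fit over hits_; returns a dummy
        // value so that the sketch is self-contained.
        return hits_.empty() ? 0.0 : 1.0;
    }

    std::vector<Hit> hits_;
    mutable bool fitted_ = false;
    mutable double momentum_ = 0.0;
};
```

Because callers only ever invoke track.momentum(), the switch from fit-on-demand to fit-once-and-cache shown here is exactly the kind of internal change that leaves the rest of the code base untouched.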
Within HEP OO models, the data will be stored in an object database (Duellmann, 1996), without making the historical distinction between events, slow control data, calibrations, etc. Treating the data in a uniform way not only reduces maintenance but also opens up new possibilities for physics analysis. For example, in the CMS computing model, the serial data processing of the past, which formed increasingly smaller and more abstracted data sets, will disappear at a logical level, so that physicists may analyse raw data from one subdetector together with highly refined reconstructed objects from another. In the following sections, where we discuss specific HEP software tools, we also describe the anticipated OO equivalents.

EVENT AND DETECTOR SIMULATION

Many programs are used to simulate high energy particle interactions. The program most widely used to simulate hadron collisions is PYTHIA (Sjöstrand, 1986; Sjöstrand and Bengtsson, 1987; Bengtsson and Sjöstrand, 1987; Sjöstrand, 1993). PYTHIA is a Fortran 77 program which simulates a wide variety of particle collisions and has hooks which facilitate the addition of new processes. While PYTHIA does not include all of the processes required to simulate high energy cosmic ray interactions and atmospheric showers, it does contain a wealth of pertinent and reliably simulated physics processes which could be profitably exploited in the development of simulation programs for cosmic ray physics (Swain and Taylor, 1997a). The future evolution of PYTHIA is not yet defined, although a well-designed OO version would be of great benefit.

PYTHIA, like most HEP generators, produces four-vectors of the final-state particles in a standard Fortran common block format known as HEPEVT. These are then used as input to the detector simulation program, which is usually based on the GEANT package (Brun, 1987). GEANT simulates the passage of particles through matter, allowing for: arbitrary geometry, materials, and magnetic field configurations; particle decays and interactions, with subsequent tracking of secondary particles; and the response of the detector. GEANT includes the following processes: pair creation; positron annihilation; Compton scattering; the photoelectric effect; photofission; Rayleigh scattering; Čerenkov photons; Molière scattering; ionisation; delta-ray production; bremsstrahlung; particle decays; and others. It also provides tools for the graphical representation of the detector and the particle trajectories.

The current Fortran 77 version of GEANT, known as GEANT3, is now a very mature and comprehensive program for the simulation of particle interactions in matter. In preparation for future HEP experiments, which will be using an OO paradigm for their software, CERN initiated the GEANT4 project (Giani, 1996; Allison, 1996) with the aim of completely rewriting GEANT in C++. While GEANT3 is already a powerful tool for the simulation of detectors in cosmic ray experiments, we believe that the impact of GEANT4 could be even greater. Firstly, GEANT4 aims to maintain precision over arbitrary scales (from microns to megaparsecs). Secondly, its modular design greatly facilitates the addition of arbitrary physics processes, which may be of particular interest to cosmic ray physicists. Thirdly, the ability to replace full step-by-step tracking and simulation of interactions with fast parametrisations in selected arbitrary volumes is built in.
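Returning briefly to the generator-to-simulation hand-off mentioned above: the HEPEVT common block can also be read directly from C++ code. The sketch below is a hedged illustration; it assumes the double-precision variant of the block with NMXHEP = 4000 and a trailing-underscore Fortran symbol name, all of which are compiler- and installation-dependent and should be checked against the local generator.

```cpp
// Interop sketch: mapping the Fortran HEPEVT common block into C++.
// Fortran arrays are column-major, so PHEP(5,NMXHEP) appears here
// with the indices transposed, as phep[NMXHEP][5].
const int NMXHEP = 4000;  // assumed size; older versions may differ

struct Hepevt {
    int    nevhep;              // event number
    int    nhep;                // number of entries in this event
    int    isthep[NMXHEP];      // status code (1 = final-state particle)
    int    idhep[NMXHEP];       // PDG particle identity code
    int    jmohep[NMXHEP][2];   // positions of the mother particles
    int    jdahep[NMXHEP][2];   // positions of the daughter particles
    double phep[NMXHEP][5];     // px, py, pz, E, mass
    double vhep[NMXHEP][4];     // production vertex x, y, z, t
};

// The block itself is defined and filled by the Fortran generator;
// this translation unit must be linked against that code.
extern "C" Hepevt hepevt_;

// Example: count the final-state particles of the current event.
int countFinalState() {
    int n = 0;
    for (int i = 0; i < hepevt_.nhep; ++i)
        if (hepevt_.isthep[i] == 1) ++n;
    return n;
}
```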
STATISTICAL AND NUMERICAL ANALYSIS

Many aspects of physics analysis require reliable statistical and numerical software libraries. The most comprehensive and reliable library of numerical, statistical, and graphical algorithms in use in HEP is NAGLIB (NAG, 1997). The most widely used minimisation package in HEP is MINUIT (James, 1992), an extremely reliable and powerful generic program for minimising arbitrary functions in a multi-dimensional parameter space. MINUIT may be used as a stand-alone package or in conjunction with generic analysis and data presentation packages such as MN_FIT (Brock, 1991) or PAW (Brun, 1989), which is described below.

DATA ANALYSIS AND PRESENTATION

The most commonly used generic data analysis and presentation package in HEP is PAW (Brun, 1989). PAW, which was developed at CERN, is a highly interactive Fortran 77 program for the manipulation and graphical presentation of data. Although PAW relies on other libraries within CERNLIB, it is extremely general and is available for all major hardware platforms. Historically, PAW started as little more than an on-screen browser for histograms, supporting simple manipulations such as the creation, filling, and copying of histograms, and control of their appearance to allow high-quality hard copies to be made. Since then, PAW has developed into a complete analysis environment, due largely to the introduction of ntuples. An ntuple is similar to a spreadsheet containing multiple variables of interest for each event, where an event is an arbitrarily defined entity. PAW permits entries in ntuples to be plotted in histograms, selected according to simple user-defined cuts or complex Fortran functions, or fitted using MINUIT. Histograms and ntuples may be created either by PAW or by an independent Fortran program, and stored and accessed using the ZEBRA data management package (Brun, 1997) from CERN.

In future, MINUIT and PAW will be replaced by more modular OO toolkits. In the former case, the algorithms will be preserved but the interfaces will be redesigned using C++. Histograms and ntuples will be superseded by more generic objects whose persistent nature will permit them to be saved in an object database (Adesanya, 1996). The data presentation functions of PAW will almost certainly be replaced by those of a commercial OO package, such as Iris Explorer using Open Inventor as the graphics class library (Adesanya, 1996; NAG, 1997). In addition to the (for HEP) novel paradigm of OO programming, new paradigms for distributed computing are being developed very rapidly by commercial companies. These are likely to play an important role in the way in which we do analysis at a distance on arbitrary hardware platforms. Prototype projects using Java for HEP analysis (Coperchio, 1996; Johnson, 1996) already show the immense potential of this approach.
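As a rough illustration of the ntuple concept in the OO style anticipated above, the following self-contained C++ sketch models an ntuple as a sequence of per-event records and applies a selection cut; the field names and cut values are invented for the example.

```cpp
#include <cstdio>
#include <vector>

// Hypothetical ntuple row: one record of named variables per event.
struct EventRow {
    double energy;   // e.g. total deposited energy (GeV)
    double theta;    // e.g. zenith angle (rad)
    int    ntracks;  // e.g. number of reconstructed tracks
};

int main() {
    // In PAW the rows would be read from a ZEBRA file; here a few
    // rows are fabricated purely to demonstrate the selection step.
    std::vector<EventRow> ntuple = {
        {12.5, 0.3, 4}, {150.0, 1.1, 17}, {48.2, 0.7, 9}
    };

    // A "cut" is simply a predicate over the row variables, much
    // like a PAW selection such as energy > 20 && ntracks >= 5.
    int pass = 0;
    for (const EventRow& row : ntuple) {
        if (row.energy > 20.0 && row.ntracks >= 5) {
            ++pass;  // in a real analysis: fill a histogram here
        }
    }
    std::printf("%d of %zu events pass the cut\n", pass, ntuple.size());
    return 0;
}
```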
EVENT AND DETECTOR VISUALISATION

The ability to visualise HEP detectors and their response to physics events is of great importance in the design, construction, and data analysis phases of an experiment. The displayed data include the detector elements; raw information, such as hits and energy deposits; reconstructed objects, such as tracks and jets; and monitoring data, such as dead or noisy detector elements. In addition to providing 3D Cartesian and abstracted views, the program should serve as a bi-directional interface to the data and reconstruction code and support such actions as: application of selection cuts; plotting; re-reconstruction; and database interrogation. Implementing such features is probably the most difficult part of developing a display program, since it requires a compatible and coherent design of the data structures and the reconstruction code, and especially close collaboration of the various software development teams.

Throughout the last decade, most event and detector visualisation programs have been developed independently by each collaboration, using GKS or PHIGS as the underlying graphics library and the Motif X11 toolkit for the graphical user interface. In 1995 the HEPVIS series of workshops² was initiated with the aim of reducing this duplication of effort through the use of common HEP and commercial solutions. It is now clear that a common approach is possible in the OO era, using Open Inventor (NAG, 1997) as the graphics class library, which in turn relies on OpenGL™ for the primitive graphics functions.

² See http://www.cern.ch/Physics/Workshops/hepvis/ on the World Wide Web for details of these workshops and many pointers to graphics-related Web servers.

SUMMARY

We believe that many of the software tools currently in use in HEP are sufficiently mature and general for more widespread use, for example in the field of Cosmic Ray Physics. The benefits of the shift to an OO paradigm are now widely accepted throughout the HEP community. The transition is necessary not only to remain in phase with commercial trends, but also to make the immense scale of future projects manageable (Swain and Taylor, 1997b). The new tools which will be developed will be more modular and therefore more easily applied in other fields.
